Search Results

Documents authored by Pass, Rafael


Document
The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False

Authors: Noam Mazor and Rafael Pass

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
The Perebor (Russian for "brute-force search") conjectures, which date back to the 1950s and 1960s, are among the oldest conjectures in complexity theory. They are a stronger form of the NP ≠ P conjecture (which they predate) and state that for "meta-complexity" problems, such as the Time-Bounded Kolmogorov Complexity Problem and the Minimum Circuit Size Problem, there are no better algorithms than brute-force search. In this paper, we disprove the non-uniform version of the Perebor conjecture for the Time-Bounded Kolmogorov complexity problem. We demonstrate that for every polynomial t(⋅), there exists a circuit of size 2^{4n/5+o(n)} that solves the t(⋅)-bounded Kolmogorov complexity problem on every instance. Our algorithm is black-box in the description of the Universal Turing Machine U employed in the definition of Kolmogorov complexity, and it leverages the characterization of one-way functions through the hardness of the time-bounded Kolmogorov complexity problem of Liu and Pass (FOCS'20) and the time-space trade-off for one-way functions of Fiat and Naor (STOC'91). We additionally demonstrate that no such black-box algorithm can have circuit size smaller than 2^{n/2-o(n)}. Along the way (and of independent interest), we extend the result of Fiat and Naor and demonstrate that any efficiently computable function can be inverted (with probability 1) by a circuit of size 2^{4n/5+o(n)}; as far as we know, this yields the first formal proof that a non-trivial circuit can invert any efficient function.
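To make the statement above easier to parse in this listing, here is a sketch of the standard underlying definitions (textbook notation; the paper fixes its own universal Turing machine U and conventions):

  K^t(x) = min { |Π| : U(Π) outputs x within t(|x|) steps },

and the t-bounded Kolmogorov complexity problem asks, given x ∈ {0,1}ⁿ (and, in the decisional variant, a threshold k), to compute K^t(x) (respectively, to decide whether K^t(x) ≤ k). Brute-force search enumerates all programs of length at most n + O(1) and runs each for t(n) steps, giving circuits of size roughly 2ⁿ ⋅ poly(t(n)); the paper's 2^{4n/5+o(n)} upper bound beats this for every polynomial t, while its 2^{n/2-o(n)} lower bound limits how far any black-box approach can go.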

Cite as

Noam Mazor and Rafael Pass. The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 80:1-80:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{mazor_et_al:LIPIcs.ITCS.2024.80,
  author =	{Mazor, Noam and Pass, Rafael},
  title =	{{The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{80:1--80:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.80},
  URN =		{urn:nbn:de:0030-drops-196088},
  doi =		{10.4230/LIPIcs.ITCS.2024.80},
  annote =	{Keywords: Kolmogorov complexity, perebor conjecture, function inversion}
}
Document
Leakage-Resilient Hardness vs Randomness

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 264, 38th Computational Complexity Conference (CCC 2023)


Abstract
A central open problem in complexity theory concerns the question of whether all efficient randomized algorithms can be simulated by efficient deterministic algorithms. The celebrated "hardness vs. randomness" paradigm pioneered by Blum-Micali (SIAM JoC'84), Yao (FOCS'84) and Nisan-Wigderson (JCSS'94) presents hardness assumptions under which, e.g., prBPP = prP (so-called "high-end" derandomization) or prBPP ⊆ prSUBEXP (so-called "low-end" derandomization), and more generally under which prBPP ⊆ prDTIME(𝒞), where 𝒞 is a "nice" class (closed under composition with a polynomial); but these hardness assumptions are not known to also be necessary for such derandomization. In this work, following the recent work by Chen and Tell (FOCS'21) that considers "almost-all-input" hardness of a function f (i.e., hardness of computing f on all but finitely many inputs), we consider "almost-all-input" leakage-resilient hardness of a function f - that is, hardness of computing f(x) even given, say, √|x| bits of leakage of f(x). We show that leakage-resilient hardness characterizes derandomization of prBPP (i.e., gives a condition that is both necessary and sufficient for derandomization), both in the high-end and in the low-end setting. In more detail, we show that there exists a constant c such that for every function T, the following are equivalent:
- prBPP ⊆ prDTIME(poly(T(poly(n))));
- the existence of a poly(T(poly(n)))-time computable function f : {0,1}ⁿ → {0,1}ⁿ that is almost-all-input leakage-resilient hard with respect to n^c-time probabilistic algorithms.
As far as we know, this is the first assumption that characterizes derandomization in both the low-end and the high-end regime. Additionally, our characterization naturally extends to derandomization of prMA and to average-case derandomization, by appropriately weakening the requirements on the function f. In particular, for average-case (a.k.a. "effective") derandomization, we no longer require the function to be almost-all-input hard, but merely to satisfy the more standard notion of average-case leakage-resilient hardness (w.r.t. every samplable distribution), whereas for derandomization of prMA, we instead consider leakage-resilience for relations.
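One natural way to formalize the central notion (a sketch for this listing; the paper's exact quantifiers and success thresholds may differ): a function f : {0,1}ⁿ → {0,1}ⁿ is almost-all-input leakage-resilient hard w.r.t. n^c-time probabilistic algorithms if for every pair (leak, A) of probabilistic n^c-time algorithms, where leak outputs at most √|x| bits, it holds for all but finitely many x that

  Pr[ A(x, leak(x, f(x))) = f(x) ] < 2/3.

Under this reading, the displayed equivalence says that such an f, computable in time poly(T(poly(n))), exists if and only if prBPP ⊆ prDTIME(poly(T(poly(n)))).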

Cite as

Yanyi Liu and Rafael Pass. Leakage-Resilient Hardness vs Randomness. In 38th Computational Complexity Conference (CCC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 264, pp. 32:1-32:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{liu_et_al:LIPIcs.CCC.2023.32,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{Leakage-Resilient Hardness vs Randomness}},
  booktitle =	{38th Computational Complexity Conference (CCC 2023)},
  pages =	{32:1--32:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-282-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{264},
  editor =	{Ta-Shma, Amnon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2023.32},
  URN =		{urn:nbn:de:0030-drops-183022},
  doi =		{10.4230/LIPIcs.CCC.2023.32},
  annote =	{Keywords: Derandomization, Leakage-Resilient Hardness}
}
Document
Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 234, 37th Computational Complexity Conference (CCC 2022)


Abstract
A central open problem in complexity theory concerns the question of whether all efficient randomized algorithms can be simulated by efficient deterministic algorithms. We consider this problem in the context of promise problems (i.e., the prBPP vs. prP problem) and show that for all sufficiently large constants c, the following are equivalent:
- prBPP = prP;
- for every BPTIME(n^c) algorithm M and every sufficiently long z ∈ {0,1}ⁿ, there exists some x ∈ {0,1}ⁿ such that M fails to decide whether Kt(x∣z) is "very large" (≥ n-1) or "very small" (≤ O(log n)),
where Kt(x∣z) denotes the Levin-Kolmogorov complexity of x conditioned on z. As far as we are aware, this yields the first full characterization of when prBPP = prP through the hardness of some class of problems. Previous hardness assumptions used for derandomization only provide a one-sided implication.
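For reference, Kt here is Levin's notion, whose standard (conditional) definition is

  Kt(x∣z) = min { |Π| + ⌈log t⌉ : U(Π, z) outputs x within t steps },

minimizing over programs Π and running times t, for a fixed universal machine U. The second item above is thus a promise (gap) problem: M must tell instances with Kt(x∣z) ≥ n-1 apart from those with Kt(x∣z) ≤ O(log n), and the characterization says prBPP = prP exactly when every BPTIME(n^c) algorithm fails to do so on some x for every sufficiently long z.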

Cite as

Yanyi Liu and Rafael Pass. Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity. In 37th Computational Complexity Conference (CCC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 234, pp. 35:1-35:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{liu_et_al:LIPIcs.CCC.2022.35,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity}},
  booktitle =	{37th Computational Complexity Conference (CCC 2022)},
  pages =	{35:1--35:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-241-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{234},
  editor =	{Lovett, Shachar},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2022.35},
  URN =		{urn:nbn:de:0030-drops-165975},
  doi =		{10.4230/LIPIcs.CCC.2022.35},
  annote =	{Keywords: Derandomization, Kolmogorov Complexity, Hitting Set Generators}
}
Document
On One-Way Functions from NP-Complete Problems

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 234, 37th Computational Complexity Conference (CCC 2022)


Abstract
We present the first natural NP-complete problem whose average-case hardness w.r.t. the uniform distribution over instances is equivalent to the existence of one-way functions (OWFs). The problem, which originated in the 1960s, is the Conditional Time-Bounded Kolmogorov Complexity Problem: let K^t(x∣z) be the length of the shortest "program" that, given the "auxiliary input" z, outputs the string x within time t(|x|), and let McK^tP[ζ] be the set of strings (x,z,k) where |z| = ζ(|x|), |k| = log |x| and K^t(x∣z) < k, where, for our purposes, a "program" is defined as a RAM machine. Our main result shows that for every polynomial t(n) ≥ n², there exists some polynomial ζ such that McK^tP[ζ] is NP-complete. We additionally extend the result of Liu-Pass (FOCS'20) to show that for every polynomial t(n) ≥ 1.1n and every polynomial ζ(⋅), mild average-case hardness of McK^tP[ζ] is equivalent to the existence of OWFs. Taken together, these results provide the following crisp characterization of what is required to base OWFs on NP ⊈ BPP: there exist concrete polynomials t, ζ such that "basing OWFs on NP ⊈ BPP" is equivalent to providing a "worst-case to (mild) average-case reduction for McK^tP[ζ]". In other words, the "holy grail" of cryptography (i.e., basing OWFs on NP ⊈ BPP) is equivalent to a basic question in algorithmic information theory. As an independent contribution, we show that our NP-completeness result can be used to shed new light on the feasibility of the polynomial-time bounded symmetry of information assertion (Kolmogorov '68).
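Set off for readability, the language defined above is

  McK^tP[ζ] = { (x, z, k) : |z| = ζ(|x|), |k| = log |x|, and K^t(x∣z) < k },

where K^t(x∣z) is the length of the shortest RAM program that, given z, outputs x within t(|x|) steps; the constraint |k| = log |x| simply means the threshold k is given as a (log |x|)-bit integer.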

Cite as

Yanyi Liu and Rafael Pass. On One-Way Functions from NP-Complete Problems. In 37th Computational Complexity Conference (CCC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 234, pp. 36:1-36:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{liu_et_al:LIPIcs.CCC.2022.36,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{On One-Way Functions from NP-Complete Problems}},
  booktitle =	{37th Computational Complexity Conference (CCC 2022)},
  pages =	{36:1--36:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-241-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{234},
  editor =	{Lovett, Shachar},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2022.36},
  URN =		{urn:nbn:de:0030-drops-165981},
  doi =		{10.4230/LIPIcs.CCC.2022.36},
  annote =	{Keywords: One-way Functions, NP-Completeness, Kolmogorov Complexity}
}
Document
Complete Volume
OASIcs, Volume 97, Tokenomics 2021, Complete Volume

Authors: Vincent Gramoli, Hanna Halaburda, and Rafael Pass

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
OASIcs, Volume 97, Tokenomics 2021, Complete Volume

Cite as

3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 1-124, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@Proceedings{gramoli_et_al:OASIcs.Tokenomics.2021,
  title =	{{OASIcs, Volume 97, Tokenomics 2021, Complete Volume}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{1--124},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021},
  URN =		{urn:nbn:de:0030-drops-158965},
  doi =		{10.4230/OASIcs.Tokenomics.2021},
  annote =	{Keywords: OASIcs, Volume 97, Tokenomics 2021, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Vincent Gramoli, Hanna Halaburda, and Rafael Pass

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 0:i-0:x, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{gramoli_et_al:OASIcs.Tokenomics.2021.0,
  author =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{0:i--0:x},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.0},
  URN =		{urn:nbn:de:0030-drops-158975},
  doi =		{10.4230/OASIcs.Tokenomics.2021.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Hybrid Consensus: Efficient Consensus in the Permissionless Model

Authors: Rafael Pass and Elaine Shi

Published in: LIPIcs, Volume 91, 31st International Symposium on Distributed Computing (DISC 2017)


Abstract
Consensus, or state machine replication, is a foundational building block of distributed systems and modern cryptography. Consensus in the classical, "permissioned" setting has been extensively studied over 30 years of distributed systems literature. Recent developments in Bitcoin and other decentralized cryptocurrencies popularized a new form of consensus in a "permissionless" setting, where anyone can join and leave dynamically, and there is no a priori knowledge of the number of consensus nodes. So far, however, all known permissionless consensus protocols assume network synchrony, i.e., the protocol must know an upper bound on the network's delay, and transactions confirm slower than this a priori upper bound. We initiate the study of the feasibility and infeasibility of achieving responsiveness in permissionless consensus. In a responsive protocol, the transaction confirmation time depends only on the actual network delay, and not on any a priori known upper bound such as a synchronous round. Classical protocols in the partially synchronous and asynchronous models naturally achieve responsiveness, since the protocol does not even know any delay upper bound. Unfortunately, we show that in the permissionless setting, consensus is impossible in the asynchronous or partially synchronous models. On the positive side, we construct a protocol called Hybrid Consensus by combining classical-style and blockchain-style consensus. Hybrid Consensus shows that responsiveness is nonetheless achievable in permissionless consensus (assuming proof-of-work) when 1) the protocol knows an upper bound on the network delay; 2) we allow a non-responsive warmup period after which transaction confirmation can become responsive; 3) honesty has some stickiness, i.e., it takes a short while for an adversary to corrupt a node or put it to sleep; and 4) less than 1/3 of the nodes are corrupt. We show that all these conditions are in fact necessary: if even one of them is violated, responsiveness becomes impossible. Our work takes a step forward in our understanding of the permissionless model and of its differences from, and relations to, classical consensus.
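To give a concrete feel for the kind of combination the abstract describes, the sketch below pairs a slow proof-of-work chain (used only to decide who participates) with a classical quorum-based confirmation step that runs at actual network speed. The committee-from-recent-miners rule, the parameters, and all names below are illustrative assumptions for this listing, not the paper's exact construction.

# Illustrative Python sketch only: one way to combine blockchain-style and
# classical-style consensus in the spirit of the abstract above.
# All names and parameters are assumptions, not the paper's construction.

from dataclasses import dataclass
from typing import Dict, List, Set

COMMITTEE_SIZE = 100      # illustrative committee size
STABLE_DEPTH = 6          # PoW blocks considered settled (the non-responsive part)


@dataclass
class PowBlock:
    height: int
    miner_id: str


def elect_committee(pow_chain: List[PowBlock]) -> List[str]:
    """Use the settled prefix of the proof-of-work chain to pick a committee
    of recent miners. This step inherits the chain's slow, synchrony-bound
    confirmation time, matching the "warmup" in condition 2 of the abstract."""
    settled = pow_chain[: max(0, len(pow_chain) - STABLE_DEPTH)]
    return [b.miner_id for b in settled[-COMMITTEE_SIZE:]]


def confirmed(committee: List[str], votes: Dict[str, Set[str]], txs: List[str]) -> List[str]:
    """Responsive confirmation: a transaction counts as confirmed as soon as
    more than 2/3 of the committee has signed it, i.e., as fast as messages
    actually arrive. Tolerating < 1/3 corrupt members matches condition 4."""
    quorum = (2 * len(committee)) // 3 + 1
    members = set(committee)
    return [tx for tx in txs if len(votes.get(tx, set()) & members) >= quorum]

The division of labor is the point of the sketch: only committee election needs the PoW chain (and hence a known delay bound), while day-to-day confirmation proceeds at the actual network delay, which is where responsiveness comes from.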

Cite as

Rafael Pass and Elaine Shi. Hybrid Consensus: Efficient Consensus in the Permissionless Model. In 31st International Symposium on Distributed Computing (DISC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 91, pp. 39:1-39:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{pass_et_al:LIPIcs.DISC.2017.39,
  author =	{Pass, Rafael and Shi, Elaine},
  title =	{{Hybrid Consensus: Efficient Consensus in the Permissionless Model}},
  booktitle =	{31st International Symposium on Distributed Computing (DISC 2017)},
  pages =	{39:1--39:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-053-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{91},
  editor =	{Richa, Andr\'{e}a},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.DISC.2017.39},
  URN =		{urn:nbn:de:0030-drops-80040},
  doi =		{10.4230/LIPIcs.DISC.2017.39},
  annote =	{Keywords: Distributed Consensus, Permissionless, Responsiveness}
}